
# Mixture of Experts Fine-tuning

**Pantheon Proto RP 1.8 30B A3B**
License: Apache-2.0
A Mixture of Experts (MoE) role-playing model built on Qwen3-30B-A3B-Base, supporting precise portrayal of multiple characters and diverse interactive experiences.
Large Language Model · English
Gryphe · 596 · 18
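The "30B A3B" naming reflects the defining property of MoE models: of roughly 30B total parameters, only about 3B are activated per token, because a gating network routes each token to a small subset of expert sub-networks. The sketch below is a toy illustration of top-k gating with scalar inputs and made-up experts and gate weights; it is not Qwen3's actual implementation.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(x, experts, gate_weights, top_k=2):
    """Route input x through the top_k highest-scoring experts.

    experts: list of callables, each a stand-in for an expert FFN
    gate_weights: one scalar per expert forming a toy linear gate
    """
    logits = [w * x for w in gate_weights]   # toy gating scores
    probs = softmax(logits)
    ranked = sorted(range(len(experts)), key=lambda i: probs[i], reverse=True)
    chosen = ranked[:top_k]
    norm = sum(probs[i] for i in chosen)     # renormalize over chosen experts
    # Only the chosen experts run: the source of MoE's sparse compute.
    return sum(probs[i] / norm * experts[i](x) for i in chosen)

# Four toy experts; only two execute per input, mirroring sparse activation.
experts = [lambda x: x + 1, lambda x: 2 * x, lambda x: x * x, lambda x: -x]
gates = [0.1, 0.9, 0.5, 0.2]
y = moe_forward(3.0, experts, gates, top_k=2)
```

Fine-tuning an MoE model such as this one updates the experts (and optionally the gate) while the routing keeps per-token compute low relative to the total parameter count.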